
    Dedicated maintenance and repair shop control for spare parts networks

    We study a repairable inventory system dedicated to a single component that is critical to operating a capital good. The system consists of a stock point containing spare components and a dedicated repair shop responsible for repairing damaged components. Components are replaced under an age-replacement strategy: a component is sent to the repair shop preventively once it reaches the age threshold, or correctively upon failure. A damaged component is replaced by a spare if one is available; otherwise the capital good is inoperable. If the repair shop has free capacity, the repair of a damaged component starts immediately; otherwise it is queued. The manager decides on the number of repairables in the system, the age threshold, and the capacity of the repair shop. There is an inherent trade-off: a low (high) age threshold reduces (increases) the probability of a corrective replacement but increases (decreases) the demand for repair capacity, and a high (low) number of repairables in the system leads to higher (lower) holding costs but decreases (increases) the probability of downtime. We first show that the single capital good setting can be modelled as a closed queueing network with a finite population, which is equivalent to a single queue with fixed capacity and state-dependent arrivals. For this queue, we derive closed-form expressions for the steady-state distribution. We subsequently use these results to approximate performance measures for the setting with multiple capital goods.
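    The age-threshold side of this trade-off can be illustrated with a small simulation. This is a sketch, not the paper's model: component lifetimes are assumed Weibull-distributed with made-up parameters, and the spare stock and repair-shop queue are left out, so only the relation between the threshold, the corrective-replacement probability, and the repair demand rate is shown.

```python
import random

def replacement_stats(age_threshold, n_cycles=100_000, seed=0):
    """Simulate an age-replacement policy for one component.

    Lifetimes are drawn from an assumed Weibull(scale=10, shape=2)
    distribution; a component goes to repair preventively at
    `age_threshold`, or correctively if it fails first.  Returns the
    fraction of corrective replacements and the replacement rate
    (repairs per unit time), which drives repair-shop demand.
    """
    rng = random.Random(seed)
    corrective = 0
    total_time = 0.0
    for _ in range(n_cycles):
        life = rng.weibullvariate(10.0, 2.0)
        if life < age_threshold:
            corrective += 1          # failed before the threshold
            total_time += life
        else:
            total_time += age_threshold  # preventive replacement
    return corrective / n_cycles, n_cycles / total_time

low = replacement_stats(5.0)    # low threshold
high = replacement_stats(15.0)  # high threshold
# Lower threshold: fewer corrective replacements, but a higher
# replacement (repair-demand) rate, as described above.
```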

    On the performance of overlapping and non-overlapping temporal demand aggregation approaches

    Temporal demand aggregation has been shown in the academic literature to be an intuitively appealing and effective approach for dealing with demand uncertainty for both fast-moving and intermittent demand items. There are two types of temporal aggregation: non-overlapping and overlapping. In the former, the time series is divided into consecutive, non-overlapping buckets of time, where the length of each bucket equals the aggregation level. The latter resembles a moving-window technique, where the window size equals the aggregation level: at each period, the window moves one step ahead, so the oldest observation is dropped and the newest is included. In a stock-control context, the aggregation level is generally set equal to the lead time. In this paper, we analytically compare the statistical performance of the two approaches. By means of numerical and empirical investigations, we show that unless the demand history is short, there is a clear advantage in using the overlapping rather than the non-overlapping approach. We also find that the margin of this advantage grows with longer lead times.
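    The two schemes can be stated compactly. In this minimal sketch (with made-up demand values), the same history yields far more aggregate observations under the overlapping approach, which is the source of its advantage when history is not short.

```python
def nonoverlapping(series, m):
    """Consecutive non-overlapping buckets of length m
    (a trailing partial bucket is dropped)."""
    n = len(series) // m
    return [sum(series[i * m:(i + 1) * m]) for i in range(n)]

def overlapping(series, m):
    """Moving window of size m, stepped one period at a time."""
    return [sum(series[i:i + m]) for i in range(len(series) - m + 1)]

demand = [3, 0, 5, 2, 0, 4, 1, 0]       # hypothetical demand history
nonoverlapping(demand, 3)  # -> [8, 6]  (2 buckets; last 2 obs dropped)
overlapping(demand, 3)     # -> [8, 7, 7, 6, 5, 5]  (6 aggregates)
```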

    Forecasting of compound Erlang demand

    Intermittent demand items dominate service and repair inventories in many industries, and they are known to be the source of dramatic inefficiencies in the defence sector. However, research on forecasting such items has been limited. Previous work in this area has been built on the assumption of a Bernoulli or Poisson demand arrival process. Nevertheless, intermittent demand patterns may often deviate from this memory-less assumption. In this work we analytically extend previous important results to model intermittent demand based on a compound Erlang process, and we provide a comprehensive categorisation scheme to be used for forecasting purposes. In a numerical investigation we assess the benefit of departing from the memory-less assumption, and we provide insights into how the degree of determinism inherent in the process affects forecast accuracy. Operationalised suggestions are offered to managers and software manufacturers dealing with intermittent demand items.
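    A compound Erlang demand process can be simulated as below. This is an illustrative sketch with assumed parameters: gaps between demand occurrences are Erlang(k), so k = 1 recovers the memory-less (compound Poisson) case, while larger k makes arrivals more regular, i.e. more deterministic; demand sizes are drawn uniformly purely for simplicity.

```python
import random

def compound_erlang_series(n_periods, k=2, rate=1.0, seed=1):
    """Per-period demand from a compound Erlang arrival process.

    Inter-arrival gaps are Erlang(k, rate), i.e. sums of k
    exponentials; k=1 gives the memory-less compound Poisson case.
    Demand sizes are uniform on 1..7 purely for illustration.
    """
    rng = random.Random(seed)
    demand = [0] * n_periods
    t = 0.0
    while True:
        t += sum(rng.expovariate(rate) for _ in range(k))  # Erlang(k) gap
        if t >= n_periods:
            break
        demand[int(t)] += rng.randint(1, 7)  # illustrative demand size
    return demand

series = compound_erlang_series(50)  # intermittent: many zero periods
```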

    On the inventory performance of demand forecasting methods of medical items in humanitarian operations

    The inventory management of medical items in humanitarian operations is a challenging task due to the intermittent nature of their demand and long replenishment lead times. While effective emergency response requires inventory build-up that saves human lives, excess inventories may have to be intentionally burnt or donated, which is costly for humanitarian organizations. Hence, linking demand forecasting to the inventory control task offers significant scope for higher performance. In this vein, it is key to select adequate forecasting methods. This paper investigates the effectiveness of parametric and non-parametric demand forecasting methods that are commonly used to deal with stock keeping units (SKUs) characterized by intermittent demand in industrial contexts. To do so, we conduct an empirical study using data on 1254 SKUs managed in three warehouses of a major humanitarian organization based in Geneva, the Middle East, and Africa. The investigation compares the inventory performance of three parametric and two bootstrapping methods when used with an order-up-to-level inventory control policy. The results demonstrate the strength of the bootstrapping methods in achieving higher service levels. The investigation yields insights into which forecasting method should be selected under particular assumptions about the demand and the lead-time value.
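    As a sketch of the non-parametric side of such a comparison, an order-up-to level can be set by bootstrapping lead-time demand from the empirical history: resample past demands with replacement, sum them over the lead time, and take the quantile matching the target service level. The demand history and parameter values below are made up for illustration.

```python
import random

def bootstrap_order_up_to(history, lead_time, service_level,
                          n_boot=5000, seed=2):
    """Order-up-to level S from a bootstrap of lead-time demand.

    Draws `lead_time` past demands with replacement `n_boot` times
    and returns the `service_level` quantile of the simulated
    lead-time demand totals.
    """
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.choice(history) for _ in range(lead_time))
        for _ in range(n_boot)
    )
    idx = min(n_boot - 1, int(service_level * n_boot))
    return totals[idx]

history = [0, 0, 3, 0, 7, 0, 0, 2, 0, 5]  # made-up intermittent history
S = bootstrap_order_up_to(history, lead_time=4, service_level=0.95)
```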

    Revisiting the value of information sharing in two-stage supply chains

    There is a substantial body of literature showing that demand information sharing can lead to a considerable reduction of the bullwhip effect and of inventory costs. The core argument underlying these results is that the downstream supply-chain member (the retailer) quickly adapts its inventory position to an updated end-customer demand forecast. However, in many real-life situations, retailers adapt slowly rather than quickly to changes in customer demand, as they cannot be sure that any change is structural. In this paper, we show that the adaptation speed and the underlying (unknown) demand process crucially affect the value of information sharing. For the situation with a single upstream supply-chain member (the manufacturer) and a single retailer, we consider two demand processes: stationary and random walk. These represent two extremes where a change in customer demand is never or always structural, respectively. The retailer and the manufacturer both forecast demand using a moving average, where the manufacturer bases its forecast on retailer demand without information sharing, but on end-customer demand with information sharing. In line with existing results, the value of information turns out to be positive under stationary demand. One contribution, though, is showing that some existing papers have overestimated this value by making an unfair comparison. Our most striking and insightful finding is that the value of information is negative when demand follows a random walk and the retailer is slow to react. Slow adaptation is the norm in real-life situations and deserves more attention in future research, exploring when information sharing indeed pays off.
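    The role of adaptation speed can be illustrated with a standard moving-average order-up-to model (a sketch under assumed i.i.d. normal demand, not the paper's exact setting): the longer the retailer's moving-average window, the slower it adapts, and the less demand variability is amplified into the orders the manufacturer observes.

```python
import random
import statistics

def bullwhip_ratio(window, lead_time=2, n=20000, seed=3):
    """Var(orders)/Var(demand) for a retailer that forecasts demand
    with a `window`-period moving average and follows an order-up-to
    policy.  A longer window models slower adaptation.  Demand is an
    assumed stationary N(20, 2) series."""
    rng = random.Random(seed)
    d = [rng.gauss(20, 2) for _ in range(n)]
    orders, prev_S = [], None
    for t in range(window, n):
        f = sum(d[t - window:t]) / window   # moving-average forecast
        S = (lead_time + 1) * f             # order-up-to level
        if prev_S is not None:
            orders.append(d[t] + S - prev_S)  # order = demand + change in S
        prev_S = S
    return statistics.variance(orders) / statistics.variance(d)

fast = bullwhip_ratio(window=4)    # fast adaptation: strong bullwhip
slow = bullwhip_ratio(window=52)   # slow adaptation: ratio near 1
```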

    Forecasting intermittent inventory demands: simple parametric methods vs. bootstrapping

    Although intermittent demand items dominate service and repair parts inventories in many industries, research on forecasting such items has been limited. A critical research question is whether one should make point forecasts of the mean and variance of intermittent demand with a simple parametric method, such as simple exponential smoothing, or instead employ some form of bootstrapping to simulate an entire distribution of demand during the lead time. The aim of this work is to answer that question by evaluating the effects of forecasting on stock-control performance in more than 7,000 demand series. Trade-offs between inventory investment and customer service show that simple parametric methods perform well, and it is questionable whether bootstrapping is worth the added complexity.
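    A minimal sketch of the parametric route: simple exponential smoothing yields point estimates of the mean demand and (via smoothed squared errors) its variance, from which a normal-approximation order-up-to level follows. The smoothing constant, the z-value, and the demand series below are illustrative assumptions.

```python
def ses_estimates(series, alpha=0.2):
    """Simple exponential smoothing of the mean demand and of the
    squared one-step error (a common variance proxy)."""
    level, mse = series[0], 0.0
    for x in series[1:]:
        err = x - level
        mse = alpha * err * err + (1 - alpha) * mse
        level = alpha * x + (1 - alpha) * level
    return level, mse

def order_up_to(series, lead_time, alpha=0.2, z=1.645):
    """Normal-approximation order-up-to level; z = 1.645 is an
    illustrative choice for a 95% cycle service level."""
    level, mse = ses_estimates(series, alpha)
    return (lead_time + 1) * level + z * ((lead_time + 1) * mse) ** 0.5

demand = [0, 3, 0, 0, 5, 1, 0, 2, 0, 4]  # made-up intermittent series
S = order_up_to(demand, lead_time=3)
```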

    Demand forecasting by temporal aggregation

    Demand forecasting performance is subject to the uncertainty underlying the time series an organization is dealing with. Many approaches may be used to reduce this uncertainty and thus improve forecasting performance. One intuitively appealing approach is to aggregate demand into lower-frequency “time buckets.” This approach is termed temporal aggregation, and in this article we investigate its impact on forecasting performance. We assume that the non-aggregated demand follows either a moving average process of order one or a first-order autoregressive process, and that a single exponential smoothing (SES) procedure is used to forecast demand. These demand processes are often encountered in practice, and SES is one of the standard estimators used in industry. Theoretical mean squared error expressions are derived for the aggregated and non-aggregated demand to contrast the relevant forecasting performances. The theoretical analysis is supported by an extensive numerical investigation and experimentation with an empirical dataset. The results indicate that the performance improvement achieved through the aggregation approach is a function of the aggregation level, the smoothing constant, and the process parameters. Valuable insights are offered to practitioners, and the article closes with an agenda for further research in this area.
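    The setting can be sketched as follows (an illustrative simulation with assumed MA(1) parameters and smoothing constant, not the article's analytical derivation): generate MA(1) demand, aggregate it into non-overlapping buckets, and apply SES at both levels.

```python
import random

def ma1_series(n, mu=10.0, theta=0.5, sigma=1.0, seed=4):
    """MA(1) demand: d_t = mu + e_t + theta * e_{t-1} (assumed params)."""
    rng = random.Random(seed)
    e_prev, out = rng.gauss(0, sigma), []
    for _ in range(n):
        e = rng.gauss(0, sigma)
        out.append(mu + e + theta * e_prev)
        e_prev = e
    return out

def ses(series, alpha=0.3):
    """SES one-step forecasts; returns final forecast and in-sample MSE."""
    f, errs = series[0], []
    for x in series[1:]:
        errs.append(x - f)
        f = alpha * x + (1 - alpha) * f
    return f, sum(v * v for v in errs) / len(errs)

d = ma1_series(5000)
agg = [sum(d[i:i + 3]) for i in range(0, len(d) - 2, 3)]  # buckets of 3
f_orig, mse_orig = ses(d)     # forecast of per-period demand (~10)
f_agg, mse_agg = ses(agg)     # forecast of 3-period bucket demand (~30)
```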

    The impact of temporal aggregation on supply chains with ARMA(1,1) demand processes

    Various approaches have been considered in the literature to improve demand forecasting in supply chains. Among these, non-overlapping temporal aggregation has been shown to be an effective approach that can improve forecast accuracy. However, its benefit has been demonstrated only under single exponential smoothing (a non-optimal method in this setting), and no theoretical analysis has examined its impact under optimal forecasting. This paper aims to bridge this gap by analysing the impact of temporal aggregation on supply chain demand and orders when optimal forecasting is used. To do so, we consider a two-stage supply chain (e.g. a retailer and a manufacturer) where the retailer faces an autoregressive moving average demand process of order (1,1) - ARMA(1,1) - that is forecasted using the optimal Minimum Mean Squared Error (MMSE) method. We derive analytical expressions for the mean squared forecast error (MSE) at the retailer and manufacturer levels, as well as for the bullwhip ratio, when the aggregation approach is used. We numerically show that, although the aggregation approach leads to an accuracy loss at the retailer's level, it may reduce the MSE at the manufacturer level by up to 90% and the bullwhip effect in the supply chain by up to 84% for long lead times.
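    MMSE forecasting of an ARMA(1,1) demand process can be sketched as below (made-up parameters, assumed known as in the analytical setting): the innovations are recovered recursively, and the one-step forecast MSE approaches the innovation variance, here σ² = 1.

```python
import random

def arma11_series(n, mu=20.0, phi=0.7, theta=0.3, sigma=1.0, seed=5):
    """ARMA(1,1): d_t = mu + phi*(d_{t-1} - mu) + e_t + theta*e_{t-1}."""
    rng = random.Random(seed)
    d_prev, e_prev, out = mu, 0.0, []
    for _ in range(n):
        e = rng.gauss(0, sigma)
        d = mu + phi * (d_prev - mu) + e + theta * e_prev
        out.append(d)
        d_prev, e_prev = d, e
    return out

def mmse_one_step(series, mu=20.0, phi=0.7, theta=0.3):
    """MMSE one-step forecasts with known parameters, recovering the
    innovations e_t = d_t - f_t recursively."""
    e, d_prev, forecasts = 0.0, mu, []
    for d in series:
        f = mu + phi * (d_prev - mu) + theta * e  # forecast of this d
        forecasts.append(f)
        e = d - f                                 # recovered innovation
        d_prev = d
    return forecasts

d = arma11_series(5000)
f = mmse_one_step(d)
mse = sum((x - y) ** 2 for x, y in zip(d, f)) / len(d)  # close to sigma^2
```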